Constrained-Storage Vector Quantization with a Universal Codebook

Authors

  • Sangeeta Ramakrishnan
  • Kenneth Rose
  • Allen Gersho
Abstract

Many image compression techniques require the quantization of multiple vector sources with significantly different distributions. With vector quantization (VQ), these sources are optimally quantized using separate codebooks, which may collectively require an enormous memory space. Since storage is limited in most applications, a convenient way to gracefully trade between performance and storage is needed. Earlier work addressed this problem by clustering the multiple sources into a small number of source groups, where each group shares a codebook. We propose a new solution based on a size-limited universal codebook that can be viewed as the union of overlapping source codebooks. This framework allows each source codebook to consist of any desired subset of the universal code vectors and provides greater design flexibility which improves the storage-constrained performance. A key feature of this approach is that no two sources need be encoded at the same rate. An additional advantage of the proposed method is its close relation to universal, adaptive, finite-state and classified quantization. Necessary conditions for optimality of the universal codebook and the extracted source codebooks are derived. An iterative design algorithm is introduced to obtain a solution satisfying these conditions. Possible applications of the proposed technique are enumerated, and its effectiveness is illustrated for coding of images using finite-state vector quantization, multistage vector quantization, and tree-structured vector quantization.
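The abstract describes an iterative, Lloyd-style procedure that jointly designs a shared universal codebook and per-source codebooks drawn from it as subsets. Below is a minimal sketch of that general idea, assuming squared-error distortion, equal subset sizes for all sources, and a simple greedy subset-selection heuristic; the names (`design_universal_vq`, `universal_size`, `subset_size`) are illustrative and not taken from the paper.

```python
# Minimal sketch of universal-codebook VQ design, assuming squared-error
# distortion and equal-size per-source subsets. Names are illustrative only;
# this is the general alternating scheme, not the authors' exact algorithm.
import numpy as np

def design_universal_vq(sources, universal_size, subset_size, iters=20, seed=0):
    """sources: list of (N_i, d) training arrays, one per vector source."""
    rng = np.random.default_rng(seed)
    all_vecs = np.vstack(sources).astype(float)
    # Initialize the universal codebook from randomly chosen training vectors.
    universal = all_vecs[rng.choice(len(all_vecs), universal_size, replace=False)].copy()

    for _ in range(iters):
        subsets, labels = [], []
        for X in sources:
            # Squared distances from each training vector to each universal code vector.
            d2 = ((X[:, None, :] - universal[None, :, :]) ** 2).sum(axis=2)
            # Heuristic subset selection: keep the code vectors that serve the
            # most training vectors of this source as nearest neighbors.
            counts = np.bincount(np.argmin(d2, axis=1), minlength=universal_size)
            chosen = np.argsort(counts)[::-1][:subset_size]
            # Nearest-neighbor encoding restricted to this source's subset.
            labels.append(chosen[np.argmin(d2[:, chosen], axis=1)])
            subsets.append(chosen)

        # Centroid update: each universal code vector becomes the mean of all
        # training vectors (pooled across sources) currently mapped to it.
        for k in range(universal_size):
            members = [X[y == k] for X, y in zip(sources, labels)]
            members = [m for m in members if len(m)]
            if members:
                universal[k] = np.vstack(members).mean(axis=0)

    return universal, subsets
```

Because each source only stores index sets into the shared codebook, the per-source subsets could also be given different sizes, which reflects the paper's point that no two sources need be encoded at the same rate.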


Similar Articles


Variable-length constrained-storage tree-structured vector quantization

Constrained-storage vector quantization (CSVQ), introduced by Chan and Gersho (1990, 1991), allows for the stagewise design of balanced tree-structured residual vector quantization codebooks with low encoding and storage complexities. On the other hand, it has been established by Makhoul et al. (1985), Riskin et al. (1991), and by Mahesh et al. (see IEEE Trans. Inform. Theory, vol.41, p.917-30,...

Full text

Variable Dimension Vector Quantization of Speech Spectra for Low Rate Vocoders

Optimal vector quantization of variable-dimension vectors is in principle feasible using a set of fixed-dimension VQ codebooks. However, for typical applications, such a multi-codebook approach demands grossly excessive and impractical storage and computational complexity. Efficient quantization of such variable-dimension spectral shape vectors is the most challenging and difficult encodin...

Full text

Linear-translate constrained storage VQ for VSPIHT wavelet image compression

A new Constrained Storage VQ (CSVQ) structure based on linear transforms and translates of a common root codebook is proposed. The new VQ structure, named LT-CSVQ (Linear Translate CSVQ), acts as a building block for multistage VQ implementations (LT-CS-MSVQ) and significantly reduces storage requirements relative to tree-multistage VQ implementations. LT-CS-MSVQ is most appropriate ...

Full text

Using Vector Quantization for Universal Background Model in Automatic Speaker Verification

We aim to describe different approaches to vector quantization in Automatic Speaker Verification. We designed a novel architecture based on multiple codebooks representing the speakers and an impostor model called the universal background model, and compared it to another vector quantization approach used for reducing training data. We compared our scheme with the baseline system, Gaussian Mixtu...

Full text


Journal:
  • IEEE Transactions on Image Processing (a publication of the IEEE Signal Processing Society)

Volume 7, Issue 6

Pages: -

Publication date: 1995